Shared inputs, entrainment, and desynchrony in elliptic bursters: from slow passage to discontinuous circle maps
What input signals will lead to synchrony vs. desynchrony in a group of
biological oscillators? This question connects with both classical dynamical
systems analyses of entrainment and phase locking and with emerging studies of
stimulation patterns for controlling neural network activity. Here, we focus on
the response of a population of uncoupled, elliptically bursting neurons to a
common pulsatile input. We extend a phase reduction from the literature to
capture inputs of varied strength, leading to a circle map with discontinuities
of various orders. In a combined analytical and numerical approach, we apply
our results to both a normal form model for elliptic bursting and to a
biophysically based neuron model from the basal ganglia. We find that,
depending on the period and amplitude of inputs, the response can either appear
chaotic (with provably positive Lyapunov exponent for the associated circle
maps), or periodic with a broad range of phase-locked periods. Throughout, we
discuss the critical underlying mechanisms, including slow-passage effects
through Hopf bifurcation, the role and origin of discontinuities, and the
impact of noise.
Comment: 17 figures, 40 pages
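A toy illustration of the kind of object this abstract describes: a piecewise-linear circle map with a single discontinuity, whose Lyapunov exponent can be estimated by averaging log|f'| along an orbit. The map and all parameter values below are hypothetical stand-ins, not the phase reduction derived in the paper.

```python
import numpy as np

def circle_map(theta, omega=0.4, a=0.8, c=0.3):
    """Toy circle map with a discontinuity at theta = c.

    omega, a, c are illustrative parameters, not values from the paper.
    """
    if theta < c:
        return (theta + omega) % 1.0
    return (theta + omega + a * (theta - c)) % 1.0

def lyapunov_exponent(theta0, n=10000, a=0.8, c=0.3):
    """Average log|f'(theta)| along an orbit of the map above."""
    theta, total = theta0, 0.0
    for _ in range(n):
        deriv = 1.0 if theta < c else 1.0 + a  # slope of the active branch
        total += np.log(deriv)
        theta = circle_map(theta, a=a, c=c)
    return total / n
```

Because both branches have slope at least 1, the estimate is non-negative; an expanding branch (a > 0) that the orbit visits yields a positive exponent, the signature of the chaotic responses described above.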
A simple mechanism for higher-order correlations in integrate-and-fire neurons
The collective dynamics of neural populations are often characterized in
terms of correlations in the spike activity of different neurons. Open
questions surround the basic nature of these correlations. In particular, what
leads to higher-order correlations -- correlations in the population activity
that extend beyond those expected from cell pairs? Here, we examine this
question for a simple, but ubiquitous, circuit feature: common fluctuating
input arriving at spiking neurons of integrate-and-fire type. We show that this
leads to strong higher-order correlations, as found in earlier work with discrete
threshold crossing models. Moreover, we find that the same is true for another
widely used, doubly-stochastic model of neural spiking, the linear-nonlinear
cascade. We explain the surprisingly strong connection between the collective
dynamics produced by these models, and conclude that higher-order correlations
are both broadly expected and can be captured with surprising accuracy by
simplified (and tractable) descriptions of neural spiking.
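A minimal sketch of the circuit feature in question: leaky integrate-and-fire neurons driven by a mixture of a shared fluctuating input and independent noise. All parameter values here are hypothetical, not fitted to any model in the paper. The shared drive induces correlated threshold crossings, which inflate the variance of the summed population spike count beyond the independent-cell prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lif_population(n_cells=50, n_steps=20000, dt=0.001,
                            tau=0.02, mu=0.8, sigma=2.0, v_th=1.0, c=0.8):
    """LIF cells driven by sqrt(c)*shared + sqrt(1-c)*private Gaussian noise.

    All parameters are illustrative choices, not values from the paper.
    """
    v = np.zeros(n_cells)
    spikes = np.zeros((n_steps, n_cells), dtype=bool)
    for t in range(n_steps):
        drive = np.sqrt(c) * rng.normal() + np.sqrt(1 - c) * rng.normal(size=n_cells)
        v += dt * (mu - v) / tau + sigma * np.sqrt(dt) * drive
        fired = v >= v_th
        spikes[t] = fired
        v[fired] = 0.0          # reset to rest after a spike
    return spikes

def population_variance(spikes, bin_steps=20):
    """Variance of the summed population count vs. the independent prediction."""
    n_bins = spikes.shape[0] // bin_steps
    counts = spikes[:n_bins * bin_steps].reshape(n_bins, bin_steps, -1).sum(axis=1)
    return counts.sum(axis=1).var(), counts.var(axis=0).sum()
```

With the shared-input fraction c well above zero, the population-count variance exceeds the sum of single-cell variances, a simple fingerprint of the beyond-pairwise structure discussed above.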
The sign rule and beyond: Boundary effects, flexibility, and noise correlations in neural population codes
Over repeat presentations of the same stimulus, sensory neurons show variable
responses. This "noise" is typically correlated between pairs of cells, and a
question with rich history in neuroscience is how these noise correlations
impact the population's ability to encode the stimulus. Here, we consider a
very general setting for population coding, investigating how information
varies as a function of noise correlations, with all other aspects of the
problem - neural tuning curves, etc. - held fixed. This work yields unifying
insights into the role of noise correlations. These are summarized in the form
of theorems, and illustrated with numerical examples involving neurons with
diverse tuning curves. Our main contributions are as follows.
(1) We generalize previous results to prove a sign rule (SR) - if noise
correlations between pairs of neurons have opposite signs vs. their signal
correlations, then coding performance will improve compared to the independent
case. This holds for three different metrics of coding performance, and for
arbitrary tuning curves and levels of heterogeneity. This generality is true
for our other results as well.
(2) As also pointed out in the literature, the SR does not provide a
necessary condition for good coding. We show that a diverse set of correlation
structures can improve coding. Many of these violate the SR, as do
experimentally observed correlations. There is structure to this diversity: we
prove that the optimal correlation structures must lie on boundaries of the
possible set of noise correlations.
(3) We provide a novel set of necessary and sufficient conditions, under
which the coding performance (in the presence of noise) will be as good as it
would be if there were no noise present at all.
Comment: 41 pages, 5 figures
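The sign rule in point (1) can be checked directly with linear Fisher information in a two-neuron toy model; the unit noise variances and tuning-derivative values below are arbitrary illustrations, not from the paper.

```python
import numpy as np

def linear_fisher_info(f_prime, rho):
    """Linear Fisher information f'^T Sigma^{-1} f' for two neurons
    with unit noise variance and noise correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    fp = np.asarray(f_prime, dtype=float)
    return float(fp @ np.linalg.solve(cov, fp))

fp = [1.0, 1.0]                               # same-sign slopes: positive signal correlation
info_indep = linear_fisher_info(fp, 0.0)      # independent-noise baseline
info_opposite = linear_fisher_info(fp, -0.3)  # noise correlation opposes signal correlation
info_same = linear_fisher_info(fp, 0.3)       # noise correlation matches signal correlation
```

For f' = (1, 1) the information reduces to 2/(1 + rho), so any negative rho beats the independent case, exactly as the sign rule predicts, while a same-sign noise correlation hurts.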
Structured chaos shapes spike-response noise entropy in balanced neural networks
Large networks of sparsely coupled, excitatory and inhibitory cells occur
throughout the brain. A striking feature of these networks is that they are
chaotic. How does this chaos manifest in the neural code? Specifically, how
variable are the spike patterns that such a network produces in response to an
input signal? To answer this, we derive a bound for the entropy of multi-cell
spike pattern distributions in large recurrent networks of spiking neurons
responding to fluctuating inputs. The analysis is based on results from random
dynamical systems theory and is complemented by detailed numerical simulations.
We find that the spike pattern entropy is an order of magnitude lower than what
would be extrapolated from single cells. This holds despite the fact that
network coupling becomes vanishingly sparse as network size grows -- a
phenomenon that depends on "extensive chaos," as previously discovered for
balanced networks without stimulus drive. Moreover, we show how spike pattern
entropy is controlled by temporal features of the inputs. Our findings provide
insight into how neural networks may encode stimuli in the presence of
inherently chaotic dynamics.
Comment: 9 pages, 5 figures
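A plug-in illustration of why correlations lower multi-cell pattern entropy relative to a single-cell extrapolation: joint entropy is subadditive, and shared input widens the gap. The generative model below (thresholded shared-plus-private Gaussians) is a hypothetical stand-in for the recurrent network in the paper, with arbitrary weights.

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy_bits(samples):
    """Plug-in entropy estimate (bits) over rows of binary patterns."""
    _, counts = np.unique(samples, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

n_samples, n_cells = 50000, 5
shared = rng.normal(size=(n_samples, 1))         # common fluctuating drive
private = rng.normal(size=(n_samples, n_cells))  # independent noise per cell
patterns = (0.8 * shared + 0.6 * private > 1.0).astype(np.uint8)

h_joint = entropy_bits(patterns)                 # entropy of full spike words
h_sum = sum(entropy_bits(patterns[:, [i]]) for i in range(n_cells))
```

Here h_joint falls below h_sum: extrapolating from single cells overestimates the population's response entropy, mirroring the gap between network and single-cell entropy described above.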